Merritt Island


NASA photos show incredible moment Orion splashed back down to Earth

Daily Mail - Science & tech

NASA has shared new photos of the incredible moment the Orion space capsule returned to Earth after flying around the moon. The uncrewed Orion capsule splashed down in the Pacific Ocean, west of Baja California, at 09:40 PST (17:40 GMT) on Sunday. Since its launch in mid-November, it has travelled more than 1.4 million miles on a path around the moon and back to Earth. The images show the moments before and after the historic splashdown, which completes Artemis 1, the first mission of Artemis – NASA's successor to the Apollo programme of the 1960s and 1970s. One photo, taken from aboard the USS Portland in the Pacific Ocean off the coast of Baja California, Mexico, shows the capsule descending towards splashdown on December 11, 2022, at the end of its successful uncrewed moon mission. Artemis 1 was NASA's uncrewed flight test of the Space Launch System (SLS) rocket and Orion spacecraft, which launched on November 16 from Kennedy Space Center, Merritt Island, Florida.


NASA's Artemis 1 spacecraft breaks a record set by Apollo 13 in 1970

Daily Mail - Science & tech

NASA's Artemis programme is already breaking records, less than two weeks after its very first spaceflight launched. The agency has confirmed that its Artemis 1 Orion capsule smashed the record for the furthest distance travelled from Earth by any craft designed to carry humans. At 08:40 EST (13:40 GMT) on Saturday (November 26), Orion reached 248,655 miles from Earth, beating the record set by Apollo 13 in April 1970. Then, at 16:06 EST (21:06 GMT) on Saturday, it reached the farthest point in its orbit – a maximum distance of 268,553 miles. Artemis 1 is an uncrewed test flight for NASA's Artemis programme, comprising the Orion spacecraft and the Space Launch System (SLS) rocket.


Span Selection Pre-training for Question Answering

Glass, Michael, Gliozzo, Alfio, Chakravarti, Rishav, Ferritto, Anthony, Pan, Lin, Bhargav, G P Shrivatsa, Garg, Dinesh, Sil, Avirup

arXiv.org Artificial Intelligence

BERT (Bidirectional Encoder Representations from Transformers) and related pre-trained Transformers have provided large gains across many language understanding tasks, achieving a new state-of-the-art (SOTA). BERT is pre-trained on two auxiliary tasks: Masked Language Model and Next Sentence Prediction. In this paper we introduce a new pre-training task inspired by reading comprehension and designed to avoid encoding general knowledge in the transformer network itself. We find significant and consistent improvements over both BERT-BASE and BERT-LARGE on multiple machine reading comprehension (MRC) and paraphrasing datasets. Specifically, our proposed model obtains SOTA results on Natural Questions, a new benchmark MRC dataset, outperforming BERT-LARGE by 3 F1 points on short answer prediction. We also establish a new SOTA on HotpotQA, improving answer prediction by 4 F1 points and supporting fact prediction by 1 F1 point. Moreover, we show that our pre-training approach is particularly effective when training data is limited, substantially improving the learning curve.
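
To make the span selection idea concrete, here is a minimal sketch of a single pre-training step, assuming the Hugging Face transformers library and a standard BERT extractive-QA head. The cloze query, the retrieved passage, and the answer below are toy examples invented for illustration; the paper's actual pipeline (its dedicated blank token, corpus-scale retrieval of answer-bearing passages, and training schedule) is not reproduced here.

# Minimal sketch of one span-selection pre-training step. This illustrates the
# idea from the abstract, not the authors' actual code or data pipeline.
import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

# A cloze-style "query": a sentence with a term blanked out. (The paper uses a
# dedicated blank token; here the literal string is simply split into subwords,
# which is good enough for a toy step.)
query = "Orion splashed down in the [BLANK] Ocean after the Artemis 1 mission."
# A retrieved passage that contains the blanked term. The model must *select*
# the answer span from this passage rather than recall the fact from its own
# parameters -- that is the point of the pre-training task.
passage = ("The uncrewed Orion capsule returned to Earth on December 11, "
           "splashing down in the Pacific Ocean west of Baja California.")
answer = "Pacific"

# Encode query and passage as a single pair, exactly as in extractive QA.
enc = tokenizer(query, passage, return_tensors="pt", return_offsets_mapping=True)

# Locate the answer's token span inside the passage half of the input.
char_start = passage.index(answer)
char_end = char_start + len(answer)
offsets = enc.pop("offset_mapping")[0]
seq_ids = enc.sequence_ids(0)
start_tok = end_tok = None
for i, (s, e) in enumerate(offsets.tolist()):
    if seq_ids[i] != 1:  # only consider tokens belonging to the passage
        continue
    if s <= char_start < e:
        start_tok = i
    if s < char_end <= e:
        end_tok = i
assert start_tok is not None and end_tok is not None

# One training step: the loss is the usual start/end span-selection loss.
outputs = model(**enc,
                start_positions=torch.tensor([start_tok]),
                end_positions=torch.tensor([end_tok]))
outputs.loss.backward()
print(f"span-selection loss: {outputs.loss.item():.3f}")

In a real run, millions of such (query, passage) pairs would be generated automatically from a text corpus, so the span-selection head sees far more supervision than any hand-labelled MRC dataset could provide; that is what drives the limited-data gains reported in the abstract.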